Rotated Binary Neural Network

Neural Information Processing Systems

Binary Neural Networks (BNNs) excel at reducing the complexity of deep neural networks, but they suffer severe performance degradation. One of the major impediments is the large quantization error between the full-precision weight vector and its binary counterpart. Previous works focus on compensating for the norm gap while leaving the angular bias hardly touched. In this paper, for the first time, we explore the influence of angular bias on the quantization error and then introduce a Rotated Binary Neural Network (RBNN), which considers the angle alignment between the full-precision weight vector and its binarized version.
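To make the norm gap vs. angular bias distinction concrete, the sketch below decomposes the quantization error for a random weight vector binarized in the common XNOR-Net style (sign vector scaled by the mean absolute weight). This is an illustrative assumption, not the RBNN method itself; the variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(16)  # illustrative full-precision weight vector

# XNOR-Net-style binarization (assumed for illustration):
# sign vector scaled by the mean absolute value of the weights.
alpha = np.abs(w).mean()
b = alpha * np.sign(w)

# Total quantization error between w and its binarized version.
quant_err = np.linalg.norm(w - b)

# Contributor 1: the norm gap (difference in vector lengths).
norm_gap = abs(np.linalg.norm(w) - np.linalg.norm(b))

# Contributor 2: the angular bias (angle between w and b).
cos_angle = w @ b / (np.linalg.norm(w) * np.linalg.norm(b))
angular_bias_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

print(f"quantization error: {quant_err:.3f}")
print(f"norm gap:           {norm_gap:.3f}")
print(f"angular bias:       {angular_bias_deg:.1f} deg")
```

Even after the scaling factor closes most of the norm gap, the angle between `w` and `b` is generally nonzero, which is exactly the residual error that rotating the weight vector before binarization aims to reduce.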


Review for NeurIPS paper: Rotated Binary Neural Network

Neural Information Processing Systems

Additional Feedback: Some suggestions going forward. Please clearly introduce the concepts before using them: the paper would be much more readable with an explanation of what angular bias is and where it comes from, and of the "flipping of weights". The graphical abstract should also be more intuitive; a 2D sketch, for example, could do. I would have liked to see the validation results formatted as comparisons (I know the content is there, but parsing it is harder), e.g.: XNor-Net / XNor-Net Ours; Bi-RealNet / Bi-RealNet Ours; ... I think a well thought-out optimization is particularly important to properly validate this method: the authors mention that the angular bias could be corrected during optimization, but in practice it is shown that this is not the case.


Review for NeurIPS paper: Rotated Binary Neural Network

Neural Information Processing Systems

All four reviewers provide favorable or very favorable reviews. The reviewers point out the novel idea of addressing the angular bias in binary neural networks, as well as the positive empirical results. The paper is therefore accepted.
